991.
An algorithm for the assignment of phosphorylation sites in peptides is described. The program uses tandem mass spectrometry data in conjunction with the respective peptide sequences to calculate site probabilities for all potential phosphorylation sites. Tandem mass spectra from synthetic phosphopeptides were used for optimization of the scoring parameters employing all commonly used fragmentation techniques. Calculation of probabilities was adapted to the different fragmentation methods and to the maximum mass deviation of the analysis. The software includes a novel approach to peak extraction, required for matching experimental data to the theoretical values of all isoforms, by defining individual peak depths for the different regions of the tandem mass spectrum. Mixtures of synthetic phosphopeptides were used to validate the program by calculating its false localization rate versus site probability cutoff characteristic. Notably, the empirically obtained precision was higher than indicated by the applied probability cutoff. In addition, the performance of the algorithm was compared to existing approaches to site localization such as Ascore. To assess the practical applicability of the algorithm to large data sets, phosphopeptides from a biological sample were analyzed, localizing more than 3000 nonredundant phosphorylation sites. Finally, the results obtained for the different fragmentation methods and localization tools were compared and discussed.
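The general idea behind such probability-based site localization can be sketched with a toy binomial model (a simplified illustration, not the published algorithm; the function names, the fixed per-peak match probability `p`, and the negative-log weighting are assumptions for this sketch):

```python
import math
from math import comb

def chance_probability(n_matched, n_trials, p=0.1):
    """Cumulative binomial probability of matching at least n_matched
    of n_trials theoretical fragment ions purely by chance, given a
    per-peak random-match probability p."""
    return sum(comb(n_trials, k) * p**k * (1 - p)**(n_trials - k)
               for k in range(n_matched, n_trials + 1))

def site_probabilities(matches_per_isoform, n_trials, p=0.1):
    """Convert each phospho-isoform's chance probability into a
    normalized site probability: the less likely the matches are by
    chance, the higher the weight of that isoform."""
    weights = [-math.log10(chance_probability(m, n_trials, p))
               for m in matches_per_isoform]
    total = sum(weights)
    return [w / total for w in weights]
```

For example, an isoform with 8 of 10 matched site-determining ions dominates one with only 2 of 10, mirroring how a site probability cutoff separates confident from ambiguous localizations.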
992.
The extracellular hemoglobin multimer of the planorbid snail Biomphalaria glabrata, intermediate host of the human parasite Schistosoma mansoni, is presumed to be a 1.44 MDa complex of six 240 kDa polypeptide subunits, arranged as three disulfide-bridged dimers. The complete amino acid sequences of two subunit types (BgHb1 and BgHb2) and the partial sequence of a third type (BgHb3) are known. Each subunit encompasses 13 paralogous heme domains and, N-terminally, a smaller plug domain responsible for subunit dimerization. We report here the recombinant expression of different functional fragments of BgHb2 in Escherichia coli, and of the complete functional subunits BgHb1 and BgHb2 in insect cells; BgHb1 was also expressed as a disulfide-bridged dimer (480 kDa). Oxygen-binding measurements of the recombinant products show a P(50) of about 7 mmHg and the absence of significant cooperativity or a Bohr effect. The covalently linked dimer of BgHb1, but not the monomer, is capable of forming aggregates closely resembling native BgHb molecules in the electron microscope.
993.
Little is known about the nature of post mortem degradation of proteins and peptides on a global level, the so-called degradome. This is especially true for nonneural tissues. Degradome properties in relation to sampling procedures on different tissues are of great importance for studies of, for instance, post-translational modifications and/or the establishment of clinical biobanks. Here, snap freezing of fresh (<2 min post mortem time) mouse liver and pancreas tissue is compared with rapid heat stabilization with regard to effects on the proteome (using two-dimensional differential in-gel electrophoresis) and peptidome (using label-free liquid chromatography). We report several proteins and peptides that exhibit heightened degradation sensitivity, for instance superoxide dismutase in liver, and peptidyl-prolyl cis-trans isomerase and insulin C-peptides in pancreas. Tissue sampling based on snap freezing produces a greater amount of degradation products and lower levels of endogenous peptides than rapid heat stabilization. We also demonstrate that degradation related solely to snap freezing can be attenuated by subsequent heat stabilization. We conclude that tissue sampling involving a rapid heat stabilization step is preferable to freezing with regard to proteomic and peptidomic sample quality.

The evolving maturation of the field of proteomics has, in the same way as in genomics, highlighted the need for better sampling procedures and sample preparation methodologies to minimize the effect of post mortem alterations. The aspect of sample quality is not new in any way and is relevant in most biomedical fields but has only lately started to receive adequate attention. The main factors influencing sample quality are the storage temperature of the body until tissue removal (foremost a problem in clinical settings and extraction of less accessible tissue samples from model organisms) and the post mortem interval (PMI) (1–3).
Post mortem degradation during the PMI is a well-known compromising problem when studying endogenous peptides (2, 3) and has also been proven to affect the results of polypeptide (here defined as proteins larger than 10 kDa) studies (3–8). PMI degradation has mainly been studied on human or mouse brain tissue, using two-dimensional electrophoresis (2-DE), SDS-PAGE, and immunoblotting (1, 3–12). There are also a few proteomic studies on muscle tissue degradation in livestock (13–16).

We and others have previously explored the effect of focused microwave irradiation with regard to sample quality, demonstrating that this method is more reliable than snap freezing in liquid nitrogen, especially with regard to post-translational modification (PTM) stability (2, 3, 17–20). An alternative method based on cryostat dissection with subsequent heat treatment through boiling has also been reported to improve endogenous peptide sample quality (21). Besides focused microwave irradiation, which is specifically used for rodent brain tissue sampling, we have also demonstrated the efficiency of rapid heat stabilization through conductivity with regard to sample degradation (3, 22). Although somewhat constrained by its dependence on how quickly the tissue is harvested from the body, the latter procedure has the added advantage that it can be used on any type of tissue and species, fresh as well as frozen. This study compares the effects of sampling procedures on the liver and pancreas degradome following rapid heat stabilization, the more traditional snap freezing, or the combination of snap freezing with subsequent heat stabilization.

To summarize, this study investigated the effects of post mortem degradation in pancreas and liver. Both tissues are well studied because of their multiple functions in the body and their involvement in different diseases such as diabetes or hepatocarcinoma.
Pancreas is especially interesting in this context as it displays endocrine secretion of peptides and exocrine secretion of digestive enzymes, the latter making it a protease-rich tissue. We used both two-dimensional difference in gel electrophoresis (2D-DIGE) and label-free liquid chromatography mass spectrometry (LC-MS)-based differential peptide display (2, 18), the latter to better investigate changes in small molecular fragments that are not easily detectable by gel-based methods. 2D-DIGE is an unrivaled methodology for characterizing alterations in isoform patterns, which is an important aspect considering that post-translational modifications (PTMs) such as phosphorylations are especially sensitive to post mortem influence within a few minutes of PMI (3). The peptidomics approach has been used in several studies to point out early post mortem changes and the protein degradation that tissues undergo following sampling and is therefore a well-suited method (3, 18, 22).
994.
Integrated top-down bottom-up proteomics combined with on-line digestion has great potential to improve the characterization of protein isoforms in biological systems and is amenable to high throughput proteomics experiments. Bottom-up proteomics ultimately provides the peptide sequences derived from the tandem MS analyses of peptides after the proteome has been digested. Top-down proteomics conversely entails the MS analyses of intact proteins for more effective characterization of genetic variations and/or post-translational modifications. Herein, we describe recent efforts toward efficient integration of bottom-up and top-down LC-MS-based proteomics strategies. Since most proteomics separations utilize acidic conditions, we exploited the compatibility of pepsin (whose optimal digestion conditions are at low pH) for integration into bottom-up and top-down proteomics work flows. Pressure-enhanced pepsin digestions were successfully performed and characterized with several standard proteins in either an off-line mode using a Barocycler or an on-line mode using a modified high pressure LC system referred to as a fast on-line digestion system (FOLDS). FOLDS was tested using pepsin and a whole microbial proteome, and the results were compared against traditional trypsin digestions on the same platform. Additionally, FOLDS was integrated with a RePlay configuration to demonstrate an ultrarapid integrated bottom-up top-down proteomics strategy using a standard mixture of proteins and a monkeypox virus proteome.

In-depth characterization and quantitation of protein isoforms, including post-translationally modified proteins, are challenging goals of contemporary proteomics. Traditionally, top-down (1, 2) and bottom-up (3, 4) proteomics have been two distinct analytical paths for liquid-based proteomics analysis.
Top-down proteomics is the mass spectrometry (MS)-based characterization of intact proteins, whereas bottom-up proteomics requires a chemical or enzymatic proteolytic digestion of all proteins into peptides prior to MS analysis. Both strategies have their own strengths and challenges and can be thought of as complementary rather than competing analytical techniques.

In a top-down proteomics approach, proteins are usually separated by one- or two-dimensional liquid chromatography (LC) and identified using high performance MS (5, 6). This approach is very attractive because it allows the identification of protein isoforms arising from various amino acid modifications, genetic variants (e.g. single nucleotide polymorphisms), mRNA splice variants, and multisite modifications (7) (e.g. specific histone modifications) as well as characterization of proteolytic processing events. However, there are several challenges that have limited the broad application of the approach. Typically, intact proteins are less soluble than their peptide complement, which effectively results in greater losses during various stages of sample handling (i.e. limited sensitivity). Similarly, proteins above ∼40–50 kDa in size are more difficult to ionize, detect, and dissociate in most high throughput MS work flows. Additionally, major challenges associated with MS data interpretation and sensitivity, especially for higher molecular mass proteins (>100 kDa) and highly hydrophobic proteins (e.g. integral membrane proteins), remain largely unsolved, thus limiting the applicability of top-down proteomics on a large scale.

Bottom-up proteomics approaches have broad application because peptides are easier to separate and analyze via LC coupled with tandem mass spectrometry (MS/MS), offering a basis for more comprehensive protein identification.
As this method relies on protein digestion (which produces multiple peptides for each protein), the sample complexity can become exceedingly large, requiring several dimensions of chromatographic separations (e.g. strong cation exchange and/or high pH reversed phase) prior to the final LC separation (typically reversed phase (RP) C18), which is oftentimes directly coupled with the mass spectrometer (3, 8). In general, the bottom-up analysis rarely achieves 100% sequence coverage of the original proteins, which can result in an incorrect/incomplete assessment of protein isoforms and combinatorial PTMs. Additionally, the digested peptides are not detected with uniform efficiency, which challenges and distorts protein quantification efforts.

Because the data obtained from top-down and bottom-up work flows are complementary, several attempts have been made to integrate the two strategies (9, 10). Typically, these efforts have utilized extensive fractionation of the intact protein separation followed by bottom-up analysis of the collected fractions. Results so far have encouraged us to consider on-line digestion methods for integrating top-down and bottom-up proteomics in a higher throughput fashion. Such an on-line digestion approach would not only benefit in terms of higher sample throughput and improved overall sensitivity but would also allow a better correlation between the observed intact protein and its peptide digestion products, greatly aiding data analysis and protein characterization efforts.

So far, however, none of the on-line integrated methods have proven robust enough for routine high throughput analyses. One of the reasons for this limited success relates to the choice of the proteolytic enzyme used for the bottom-up segment.
Trypsin is by far the most widely used enzyme for proteome analyses because it is affordable (relative to other proteases), it has been well characterized for proteome research, and it offers a nice array of detectable peptides due to a fairly even distribution of lysines and arginines across most proteins. However, protein/peptide RPLC separations (optimal at low pH) are fundamentally incompatible with on-line trypsin digestion (optimal at pH ∼ 8) (11, 12). Therefore, on-line coupling of trypsin digestion and RPLC separations is fraught with technological challenges, and proposed solutions (12) have not proven to be robust enough for integration into demanding high throughput platforms.

Our approach to this challenge was to investigate alternative proteases that may be more compatible with automated on-line digestion, peptide separation, and MS detection. Pepsin, which is acid-compatible (i.e. it acts in the stomach to initially aid in the digestion of food) (13), is a particularly promising candidate. This protease has previously been successfully used for the targeted analyses of protein complexes, hydrogen/deuterium exchange experiments (14, 15), and characterization of biopharmaceuticals (16, 17). Generally, pepsin preferentially cleaves the peptide bond located on the N-terminal side of hydrophobic amino acids, such as leucine and phenylalanine, although with less specificity than the preferential cleavage observed for trypsin at arginine and lysine. The compatibility of pepsin with typical LC-MS operation makes it an ideal choice for the development of novel approaches combining protein digestion, protein/peptide separation, and MS-based protein/peptide identification.

To develop an automated system capable of simultaneously capturing top-down and bottom-up data, enzyme kinetics of the chosen protease must be extremely fast (because one cannot wait hours as is typical when performing off-line proteolysis).
Another requirement is the use of an immobilized enzyme or a low enough concentration of the enzyme such that autolysis products do not obscure the detection of substrate peptides. The latter was a concern when using pepsin because prior hydrogen/deuterium exchange experiments used enzyme:substrate ratios up to 1:2 (18, 19). To test whether or not such a large concentration of pepsin was necessary, we performed pepsin digestion at ratios of 1:20. Many alternative energy inputs into the system were considered for speeding up the digestion. For instance, it has been shown that an input of ultrasonic energy could accelerate the reaction rate of a typical trypsin digestion while using small amounts of a protease (20). Because ultrasonic energy results in an increase of temperature and microenvironments of high pressure, it has been hypothesized that the higher temperature was the component responsible for the enhanced enzyme activity (21). López-Ferrer et al. (22, 23), however, have demonstrated that application of higher pressure with incorporation of a Barocycler alone can make trypsin display faster enzyme kinetics. This phenomenon can easily be integrated with an LC separation (which already operates at elevated pressure) to enable an automatable ultrarapid on-line digestion LC-MS proteomics platform. Herein, we refer to this platform as the fast on-line digestion system (FOLDS) (23). Although FOLDS has been described before using trypsin, here the system is characterized with pepsin, and the results obtained are compared with results attainable with trypsin. Like trypsin, pepsin produced efficient protein digestion in just a few minutes when placed under pressure. Because of the natural maximal activity of pepsin at low pH, the FOLDS can be incorporated with a RePlay (Advion Biosciences, Ithaca, NY) system, and this powerful combination is what ultimately makes the integration of top-down and bottom-up proteomics analyses possible.
The integrated analysis begins with a chromatographic separation of intact proteins. The separated proteins are then split into two streams. One stream proceeds directly to the mass spectrometer for MS and/or tandem MS analysis. The second stream is split into a long capillary where the chromatographic separation of the proteins is maintained, but their arrival at the mass spectrometer for detection is delayed. This is in essence the concept of RePlay (24, 25). Herein, we have taken the RePlay a step further by implementing our FOLDS technology into the second, delayed stream of proteins. While these delayed proteins travel down the long, narrow capillary, they are exposed to pepsin where, in combination with the pressure, the proteins are quickly and reproducibly digested. These peptide fragments are subsequently subjected to MS and/or tandem MS analysis. The FOLDS RePlay system allows the rapid and robust incorporation of the integrated top-down bottom-up proteomics work flow with the ability not only to identify proteins but also to sequence multisite/combinatorial PTMs, because all detected peptides (from the FOLDS analysis) are confined to the original chromatographic peak of the protein they were derived from. The analysis of protein mixtures using this integrated strategy reduces the total amount of sample required to obtain both the top-down and bottom-up data, increases throughput, and improves protein sequence coverage.
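The cleavage preference described for pepsin (cutting N-terminal to hydrophobic residues such as leucine and phenylalanine) can be illustrated with a toy in-silico digest (a minimal sketch; real pepsin specificity is broader and context-dependent, and restricting the rule to Leu/Phe here is an illustrative assumption):

```python
def pepsin_digest(sequence, cleave_before=("L", "F")):
    """Toy in-silico pepsin digest: cut the peptide bond on the
    N-terminal side of the listed hydrophobic residues."""
    peptides, start = [], 0
    for i, aa in enumerate(sequence):
        # cleave before L/F, but never emit an empty peptide
        if i > start and aa in cleave_before:
            peptides.append(sequence[start:i])
            start = i
    peptides.append(sequence[start:])  # final C-terminal peptide
    return peptides

# e.g. pepsin_digest("AKLGF") -> ["AK", "LG", "F"]
```

Contrast this with a tryptic rule (cleaving after Arg/Lys), which tends to give a different, often more regular, peptide population.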
995.
The high frequency of p53 mutation in human cancers indicates the important role of p53 in suppressing tumorigenesis. It is well established that p53 regulates multiple, distinct cellular functions such as cell-cycle arrest and apoptosis. Despite intensive studies, little is known about which function is essential, or whether multiple pathways are required, for p53-dependent tumor suppression in vivo. Using a mouse brain carcinoma model that shows high selective pressure for p53 inactivation, we found that even partially abolishing p53-dependent apoptosis by Bax inactivation was sufficient to significantly reduce the selective pressure for p53 loss. This finding is consistent with previous reports that apoptosis is the primary p53 function selected against during Eμ-myc-induced mouse lymphoma progression. However, unlike what was observed in the Eμ-myc-induced lymphoma model, attenuation of apoptosis is not sufficient to phenocopy the aggressive tumor progression associated with complete loss of p53 activity. We conclude that apoptosis is the primary tumor-suppressive p53 function and that ablation of additional p53 pleiotropic effects further exacerbates tumor progression.
996.
Hard-wired Pavlovian responses elicited by predictions of rewards and punishments exert significant benevolent and malevolent influences over instrumentally appropriate actions. These influences come in two main groups, defined along anatomical, pharmacological, behavioural, and functional lines. Investigations of these influences have so far concentrated on the groups as a whole; here we take the critical step of looking inside each group, using a detailed reinforcement learning model to distinguish effects to do with value, specific actions, and general activation or inhibition. We show a high degree of sophistication in Pavlovian influences, with appetitive Pavlovian stimuli specifically promoting approach and inhibiting withdrawal, and aversive Pavlovian stimuli promoting withdrawal and inhibiting approach. These influences account for differences in the instrumental performance of approach and withdrawal behaviours. Finally, although losses are as informative as gains, we find that subjects neglect losses in their instrumental learning. Our findings argue for a view of the Pavlovian system as a constraint or prior, facilitating learning by alleviating the computational costs that come with increased flexibility.
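The kind of Pavlovian bias described here is often modelled as an additive term on the approach propensity inside a softmax choice rule. The sketch below is a minimal illustration under that assumption, not the authors' full reinforcement learning model; the parameter names `beta` (inverse temperature) and `kappa` (Pavlovian weight) are conventions assumed for this example:

```python
import math

def p_approach(q_approach, q_withdraw, pav_value, beta=3.0, kappa=1.0):
    """Softmax probability of choosing approach over withdrawal,
    with a Pavlovian term (kappa * pav_value) added to the approach
    propensity: appetitive cues (pav_value > 0) promote approach and
    inhibit withdrawal; aversive cues (pav_value < 0) do the opposite."""
    a = beta * q_approach + kappa * pav_value
    w = beta * q_withdraw
    return math.exp(a) / (math.exp(a) + math.exp(w))
```

With equal instrumental values, an appetitive cue tilts choice toward approach and an aversive cue toward withdrawal, which is exactly the asymmetry in approach/withdrawal performance the abstract describes.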
998.
Translesion DNA synthesis (TLS) is a DNA damage tolerance mechanism in which specialized low-fidelity DNA polymerases bypass replication-blocking lesions; it is usually associated with mutagenesis. In Saccharomyces cerevisiae a key event in TLS is the monoubiquitination of PCNA, which enables recruitment of the specialized polymerases to the damaged site through their ubiquitin-binding domain. In mammals, however, there is a debate about the requirement for ubiquitinated PCNA (PCNA-Ub) in TLS. We show that UV-induced Rpa foci, indicative of single-stranded DNA (ssDNA) regions caused by UV, accumulate faster and disappear more slowly in Pcna(K164R/K164R) cells, which are resistant to PCNA ubiquitination, compared to Pcna(+/+) cells, consistent with a TLS defect. Direct analysis of TLS in these cells, using gapped plasmids with site-specific lesions, showed that TLS is strongly reduced across UV lesions and the cisplatin-induced intrastrand GG crosslink. A similar effect was obtained in cells lacking Rad18, the E3 ubiquitin ligase that monoubiquitinates PCNA. Consistently, cells lacking Usp1, the enzyme that de-ubiquitinates PCNA, exhibited increased TLS across a UV lesion and the cisplatin adduct. In contrast, cells lacking the Rad5 homologs Shprh and Hltf, which polyubiquitinate PCNA, exhibited normal TLS. Knocking down the expression of the TLS genes Rev3L, PolH, or Rev1 in Pcna(K164R/K164R) mouse embryo fibroblasts each caused an increased sensitivity to UV radiation, indicating the existence of TLS pathways that are independent of PCNA-Ub. Taken together, these results indicate that PCNA-Ub is required for maximal TLS. However, TLS polymerases can be recruited to damaged DNA also in the absence of PCNA-Ub and perform TLS, albeit at a significantly lower efficiency and with altered mutagenic specificity.